

Search for: All records

Creators/Authors contains: "Schubert, Ulrich"


  1. Abstract: We showcase the calculation of the master integrals needed for the two-loop mixed QCD-QED virtual corrections to the neutral-current Drell-Yan process ($q\bar{q} \to l^+ l^-$). After establishing a basis of 51 master integrals, we cast them into canonical form by using the Magnus algorithm. The dependence on the lepton mass is then expanded such that potentially large logarithmic contributions are kept. After determining all boundary constants, we give the coefficients of the Taylor series around four space-time dimensions in terms of generalized polylogarithms up to weight four.
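As a schematic illustration of the canonical form referred to in this abstract (the symbols below are generic placeholders, not the paper's actual basis or kinematic variables):

```latex
% The vector \vec{f} of the 51 master integrals satisfies a system of
% first-order differential equations in the kinematic variables x.
% "Canonical" (epsilon-factorized) form means the dimensional
% regulator \epsilon = (4 - d)/2 factors out of the connection:
d\,\vec{f}(x,\epsilon) = \epsilon\, dA(x)\, \vec{f}(x,\epsilon)
% The solution is then a path-ordered exponential whose Taylor
% coefficients in \epsilon are iterated integrals -- here,
% generalized polylogarithms up to weight four:
\vec{f}(x,\epsilon) = \mathbb{P}\exp\!\left(\epsilon \int_{\gamma} dA\right) \vec{f}(x_0,\epsilon)
```

The Magnus algorithm mentioned in the abstract is one way to construct the rotation that brings a pre-canonical system into this $\epsilon$-factorized form.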
  2. Abstract: In this paper we present a fully differential calculation of the contributions to the partial widths $H \to b\bar{b}$ and $H \to c\bar{c}$ that are sensitive to the top-quark Yukawa coupling $y_t$ to order $\alpha_s^3$. These contributions first enter at order $\alpha_s^2$ through terms proportional to $y_t y_q$ ($q = b, c$). At order $\alpha_s^3$, corrections to the mixed terms are present, as well as a new contribution proportional to $y_t^2$. Our results retain the mass of the final-state quarks throughout, while the top quark is integrated out, resulting in an effective field theory (EFT). Our results are implemented in a Monte Carlo code allowing for the application of arbitrary final-state selection cuts. As an example, we present differential distributions for observables in the Higgs boson rest frame using the Durham jet clustering algorithm. We find that the total impact of the top-induced (i.e. EFT) pieces is sensitive to the nature of the final-state cuts, particularly b-tagging and c-tagging requirements. For bottom quarks, the EFT pieces contribute to the total width (and differential distributions) at around the percent level. The impact is much bigger for the $H \to c\bar{c}$ channel, with effects as large as 15%. We show, however, that their impact can be significantly reduced by the application of jet-tagging selection cuts.
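The Durham jet clustering mentioned in this abstract repeatedly combines the particle pair with the smallest distance $y_{ij} = 2\min(E_i^2, E_j^2)(1 - \cos\theta_{ij})/Q^2$ until all distances exceed a cut $y_\mathrm{cut}$. A minimal sketch of the distance measure (the function name and the example kinematics are illustrative, not taken from the paper's code):

```python
def durham_distance(E_i, E_j, cos_theta_ij, Q):
    """Durham (k_T) distance y_ij between two particles.

    E_i, E_j      : particle energies
    cos_theta_ij  : cosine of the opening angle between them
    Q             : total visible energy (e.g. the Higgs mass
                    when clustering in the Higgs rest frame)
    Pairs are merged while min(y_ij) < y_cut.
    """
    return 2.0 * min(E_i, E_j) ** 2 * (1.0 - cos_theta_ij) / Q ** 2

# A back-to-back massless pair carrying the full energy Q:
# E_i = E_j = Q/2 and cos(theta) = -1 gives y_ij = 1.
y = durham_distance(62.5, 62.5, -1.0, 125.0)
print(y)  # 1.0
```

Jet-tagging cuts of the kind discussed in the abstract are then applied to the jets that survive this clustering.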
  3. Abstract: A framework is presented to extract and understand decision-making information from a deep neural network (DNN) classifier of jet substructure tagging techniques. The general method studied is to provide expert variables that augment inputs ("eXpert AUGmented" variables, or XAUG variables), then apply layerwise relevance propagation (LRP) to networks both with and without XAUG variables. The XAUG variables are concatenated with the intermediate layers after network-specific operations (such as convolution or recurrence), and used in the final layers of the network. The results of comparing networks with and without the addition of XAUG variables show that XAUG variables can be used to interpret classifier behavior, increase discrimination ability when combined with low-level features, and in some cases capture the behavior of the classifier completely. The LRP technique can be used to find relevant information the network is using, and when combined with the XAUG variables, can be used to rank features, allowing one to find a reduced set of features that capture part of the network performance. In the studies presented, adding XAUG variables to low-level DNNs increased the efficiency of classifiers by as much as 30-40%. In addition to performance improvements, an approach to quantify numerical uncertainties in the training of these DNNs is presented.
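Layerwise relevance propagation, used in this abstract to rank input features, redistributes a prediction's relevance backwards through the network layer by layer while (approximately) conserving the total. A minimal sketch of the common epsilon-rule for a single dense layer, assuming generic shapes and a small stabilizer; this is an illustrative implementation of LRP, not the paper's actual network or code:

```python
import numpy as np

def lrp_epsilon(a, W, b, R_out, eps=1e-6):
    """LRP epsilon-rule for one dense layer (illustrative sketch).

    a     : input activations, shape (n,)
    W     : weight matrix, shape (n, m)
    b     : biases, shape (m,)
    R_out : relevance assigned to the layer's m outputs
    Returns the relevance redistributed onto the n inputs:
        R_j = a_j * sum_k W_jk * R_k / (z_k + eps * sign(z_k))
    """
    z = a @ W + b                        # pre-activations z_k
    s = R_out / (z + eps * np.sign(z))   # stabilized ratios
    return a * (W @ s)                   # relevance on the inputs

# With zero biases, total relevance is conserved layer to layer:
a = np.array([1.0, 2.0])
W = np.array([[0.5, -0.2],
              [0.3,  0.4]])
R_in = lrp_epsilon(a, W, np.zeros(2), R_out=np.array([1.0, 0.5]))
print(R_in.sum())  # ~1.5, matching R_out.sum()
```

Iterating this rule from the output back to the input layer yields per-feature relevance scores, which is the kind of ranking the abstract combines with the XAUG variables.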